Theoretical Analysis and Comparison of Several Criteria on Linear Model Dimension Reduction

Authors

  • Shikui Tu
  • Lei Xu
Abstract

Detecting the dimension of the latent subspace of a linear model, such as Factor Analysis, is a well-known model selection problem. The common approach is a two-phase implementation guided by an information criterion. Aiming at a theoretical analysis and comparison of different criteria, we formulate a tool that yields an order of their approximate underestimation tendencies, i.e., AIC, BIC/MDL, CAIC, BYY-FA(a), from weak to strong, under mild conditions, by studying a key statistic and a crucial but unknown indicator set. We also find that DNLL favors cases with slightly dispersed signal and noise eigenvalues. Simulations agree with the theoretical results, and also indicate the advantage of BYY-FA(b) in cases of small sample size and large noise.
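The two-phase procedure the abstract refers to can be illustrated with a small sketch: fit Factor Analysis models of increasing latent dimension by maximum likelihood, then select the dimension minimizing a criterion. This is a minimal illustration using scikit-learn's FactorAnalysis and the textbook AIC/BIC formulas only; it does not implement the paper's BYY-FA or DNLL criteria, and the synthetic data and free-parameter count are assumptions.

```python
# Sketch: two-phase dimension selection for Factor Analysis via AIC/BIC.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n, p, true_k = 200, 10, 3
# Synthetic data: 3 latent factors plus moderate noise (an assumption).
A = rng.normal(size=(p, true_k))
X = rng.normal(size=(n, true_k)) @ A.T + 0.5 * rng.normal(size=(n, p))

def n_params(p, k):
    # Free parameters of an FA model: p*k loadings plus p unique
    # variances, minus k*(k-1)/2 for rotational indeterminacy.
    return p * k + p - k * (k - 1) // 2

best = {}
for k in range(1, 7):
    fa = FactorAnalysis(n_components=k).fit(X)
    ll = fa.score(X) * n              # score() is per-sample log-likelihood
    d = n_params(p, k)
    aic = -2 * ll + 2 * d
    bic = -2 * ll + d * np.log(n)
    for name, val in (("AIC", aic), ("BIC", bic)):
        if name not in best or val < best[name][0]:
            best[name] = (val, k)

print({name: k for name, (_, k) in best.items()})
```

BIC/MDL penalizes each extra parameter by log(n)/2 rather than 1, which is one source of the stronger underestimation tendency the paper analyzes.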


Related articles

Enhancing Efficiency of Neural Network Model in Prediction of Firms Financial Crisis Using Input Space Dimension Reduction Techniques

The main focus of this study is data pre-processing: reducing the number of inputs, i.e., the input space size, so as to generalize the data set in fewer dimensions without losing the most significant information. When the input space is large, the most important input variables can be identified and insignificant variables eliminated, or a variable ...


A Multi Linear Discriminant Analysis Method Using a Subtraction Criteria

Linear dimension reduction has been used in different applications such as image processing and pattern recognition. These methods fold the original data into vectors and project them onto a small number of dimensions. In some applications, however, we may face data that are not vectors, such as image data. Folding multidimensional data into vectors causes the curse of dimensionality and mixes the differe...


A comparison of generalized linear discriminant analysis algorithms

Linear Discriminant Analysis (LDA) is a dimension reduction method which finds an optimal linear transformation that maximizes the class separability. However, in undersampled problems where the number of data samples is smaller than the dimension of data space, it is difficult to apply the LDA due to the singularity of scatter matrices caused by high dimensionality. In order to make the LDA ap...
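One common workaround for the singularity problem described above is to reduce dimension (e.g., with PCA) before applying LDA, so the scatter matrices computed in the reduced space are nonsingular. The sketch below illustrates this on assumed synthetic undersampled data; it is a generic pipeline, not any of the generalized LDA algorithms compared in the paper.

```python
# Sketch: PCA followed by LDA on an undersampled problem (n < p).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_per_class, p = 10, 100          # 30 samples, 100 features: undersampled
means = [
    np.zeros(p),
    np.r_[3 * np.ones(5), np.zeros(p - 5)],
    np.r_[np.zeros(5), 3 * np.ones(5), np.zeros(p - 10)],
]
X = np.vstack([rng.normal(size=(n_per_class, p)) + m for m in means])
y = np.repeat([0, 1, 2], n_per_class)

# PCA to 20 components keeps the scatter matrices full rank for LDA.
clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
clf.fit(X, y)
acc = clf.score(X, y)             # training accuracy
print(acc)
```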


Non-Linear Behavior and Shear Strength of Diagonally Stiffened Steel Plate Shear Walls

In this study, non-linear behavior of diagonally stiffened steel plate shear walls as a seismic resisting system has been investigated, and theoretical formulas for estimating shear strength capacity of the system have been proposed. Several validated analytical finite element models of steel shear walls with various stiffener dimensions are generated to verify and compare the analytical and th...


Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron, et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman, et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
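The coordinate descent idea summarized above can be sketched for the simplest case, the lasso (L1-penalized least squares): cycle through the coefficients, updating each by a soft-thresholded univariate regression on the partial residual. This is a minimal textbook implementation on assumed synthetic data, not the paper's penalized Bregman divergence estimator.

```python
# Sketch: cyclic coordinate descent for the lasso,
# minimizing 0.5*||y - X b||^2 + lam*||b||_1.
import numpy as np

def soft_threshold(z, t):
    # Shrink z toward zero by t; exact zero inside [-t, t].
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ beta                      # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]        # remove j-th contribution
            rho = X[:, j] @ r             # univariate fit on partial residual
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * beta[j]        # restore residual
    return beta

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
true_beta = np.r_[2.0, -1.5, np.zeros(8)]
y = X @ true_beta + 0.1 * rng.normal(size=100)
beta_hat = lasso_cd(X, y, lam=5.0)
print(np.round(beta_hat, 2))
```

The per-coordinate updates touch only one column of X at a time, which is the source of the computational advantage reported for CD over path algorithms such as LARS.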




Publication date: 2009